Instagram For-Profit Accounts Posting Faulty Health Information


Nov. 9, 2022 -- Nearly 1 in 4 popular posts on Instagram about hepatitis B contained misinformation, and these posts were far more likely than accurate posts to come from for-profit accounts or accounts selling a product or service.

That's according to new research recently presented at the American College of Gastroenterology 2022 Annual Scientific Meeting in Charlotte, NC. 

"Users who generate hepatitis B misinformation also have greater reach with a higher number of followers and engagement with more likes than users who do not promote misinformation," said presenter Zachary C. Warner, MD, MPH, an internal medicine resident at the University of Arizona. "It is possible that patients with chronic health conditions — conditions that do not have simple treatments — are vulnerable to online health misinformation and for-profit users."

While misinformation and skepticism about evidence-based medicine have become more prevalent online, patients are turning to social media and other user-generated sites for information and support about their health, according to Warner. 

"While these sites are useful because they can provide access to social support and information that patients otherwise would lack, medical information on social media is unregulated," he warned. 

Unverified Intel

Though the consequences of exposure to online misinformation are not well studied, negative effects are possible. 

"Adoption of unproven cures and symptom management may increase patients' risk for poor health outcomes and financial hardship," Warner said. "Unproven cures and symptom management approaches are unlikely to be covered by health insurance, potentially leaving patients paying high out-of-pocket costs."

Warner and his team restricted their search of Instagram to a snapshot of 1 month in December 2021. They searched for all publicly available posts that mentioned "hepatitis B" or "hep B." After removing duplicates from the top 55 posts for each term, they coded the remaining 103 posts with a validated tool for assessing misinformation. The tool's variables included engagement, such as likes and comments; user characteristics, such as number of followers; and claims with misinformation as judged by medical experts.

Then the researchers analyzed the relationship between profitability and misinformation among the posts. Almost one quarter of the posts (23%) had misinformation about hepatitis B or its treatment. These posts also had greater average engagement with 1,599 likes compared with posts with accurate information on hepatitis B, which had an average 970 likes. Accounts with posts containing misinformation were also following a higher average number of accounts (1,127) than those with accurate posts about hepatitis B (889 accounts). But the accounts posting misinformation had about a third as many average followers (22,920) as the accounts posting accurate information (70,442 followers).

"We believe it is wise to maintain a hefty level of skepticism for information that promises outcomes that are 'too good to be true,' uses anecdotes for support, or is experimental," Warner said. "We recommend the CRAAP test, which guides individuals to evaluate sources of health information."

Does This Pass the CRAAP Test?

  • Consider the Currency of the information
  • the Relevance to your needs
  • the Authority of the source
  • the Accuracy of the content
  • the Purpose for which the source exists

The researchers found in their study that just under a third (30%) of the hepatitis B posts referenced a conspiracy theory and a similar proportion came from for-profit accounts (29%). And just over a third (34%) of the posts came from accounts that were selling a product or service through Instagram.

Overall, more than three times as many posts with misinformation came from for-profit accounts (47%) as posts with accurate information (14%). A similar gap appeared for accounts selling a product or service: 43% of posts with misinformation came from such accounts, compared with 13% of accurate posts.

These findings were not surprising to David Gorski, MD, PhD, professor of surgery at the Wayne State University School of Medicine. 

"Although misinformation about health is often driven by ideology and belief, it is almost always also driven by the profit motive of practitioners who sell treatments based on the misinformation," he said. 

"Most quacks, in other words, do believe in the quackery that they sell, and believers are far more effective salespeople than grifters who know that what they are selling is quackery," said Gorski.

"We heavily emphasize that patients do their best to assess the possible drivers behind the individuals or organizations who create the health information viewed online, particularly on social media sites," Warner said. He also advised that clinicians and health organizations openly engage in online and in-person conversations about misinformation.

"Hard-core believers in misinformation are almost always unreachable and unteachable, and it's largely a waste of time to try to change their minds," Gorski said. "However, people who are on the fence, who aren't sure, are reachable. We should target our educational efforts at them, not at those selling the quackery."